Chi-Squared Distance Metric Learning for Histogram Data

Authors
Abstract

Related Articles

The Quadratic-Chi Histogram Distance Family

We present a new histogram distance family, the Quadratic-Chi (QC). QC members are Quadratic-Form distances with a cross-bin χ²-like normalization. The cross-bin χ²-like normalization reduces the effect of large bins having undue influence. Normalization was shown to be helpful in many cases, where the χ² histogram distance outperformed the L2 norm. However, χ² is sensitive to quantization effects, ...
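For reference, the bin-to-bin χ² distance mentioned above is commonly written as χ²(P, Q) = ½ Σᵢ (Pᵢ − Qᵢ)² / (Pᵢ + Qᵢ) (conventions differ on the ½ factor), and a Quadratic-Form distance as √((P − Q)ᵀ A (P − Q)) for a bin-similarity matrix A. The following NumPy sketch shows both building blocks; the function names and the small-ε guard are my own additions, and the QC family itself, which combines these two ideas, is not reproduced here.

```python
import numpy as np

def chi2_distance(p, q, eps=1e-12):
    """Bin-to-bin chi-squared histogram distance:
    0.5 * sum_i (p_i - q_i)^2 / (p_i + q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * np.sum((p - q) ** 2 / (p + q + eps))  # eps guards empty bins

def quadratic_form_distance(p, q, A):
    """Cross-bin Quadratic-Form distance sqrt((p - q)^T A (p - q)),
    where A is a bin-similarity matrix."""
    diff = np.asarray(p, float) - np.asarray(q, float)
    return np.sqrt(diff @ A @ diff)

p = np.array([0.1, 0.4, 0.4, 0.1])
q = np.array([0.2, 0.3, 0.3, 0.2])
A = np.eye(4)                            # identity similarity reduces to L2
print(chi2_distance(p, q))               # ~0.048
print(quadratic_form_distance(p, q, A))  # == np.linalg.norm(p - q) == 0.2
```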

The Quadratic-Chi Histogram Distance Family - Appendices

This document contains the appendices for the paper “The Quadratic-Chi Histogram Distance Family” [1]: proofs and additional results. In section 2 we prove that all Quadratic-Chi histogram distances are continuous. In section 3 we prove that EMD, ÊMD, and all Quadratic-Chi histogram distances are Similarity-Matrix-Quantization-Invariant. In section 4 we present additional shape classification res...

Chi-Squared Distance and Metamorphic Virus Detection

A thesis by Annie H. Toderici. Malware are programs designed with malicious intent. Metamorphic malware change their internal structure each generation while still maintaining their original behavior. As metamorphic malware become more sophisticated, it is important to develop efficient and accurate detection techniques. Current commercial a...

Chi-squared: A simpler evaluation function for multiple-instance learning

This paper introduces a new evaluation function for solving the multiple-instance problem. Our approach makes use of the main idea of diverse density (Maron, 1998; Maron & Lozano-Pérez, 1998) but finds the best concept using the chi-square statistic. This approach is simpler than diverse density and allows us to search more extensively by using properties of the contingency table to prune in a g...
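The evaluation described above scores a candidate concept with a chi-square statistic computed from a contingency table over positive and negative bags. The SciPy sketch below illustrates that scoring step under my own assumptions about the table layout (rows: positive vs. negative bags; columns: covered vs. not covered by the candidate); it is not taken from the paper.

```python
import numpy as np
from scipy.stats import chi2_contingency

def chi2_score(pos_covered, pos_total, neg_covered, neg_total):
    """Chi-square statistic of a 2x2 contingency table:
    rows = (positive bags, negative bags),
    cols = (covered by candidate concept, not covered)."""
    table = np.array([
        [pos_covered, pos_total - pos_covered],
        [neg_covered, neg_total - neg_covered],
    ])
    stat, p_value, dof, expected = chi2_contingency(table)
    return stat

# A candidate concept covering 18/20 positive bags but only
# 3/20 negative bags gets a high score.
print(chi2_score(18, 20, 3, 20))
```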

Deep Distance Metric Learning with Data Summarization

We present Deep Stochastic Neighbor Compression (DSNC), a framework to compress training data for instance-based methods (such as k-nearest neighbors). We accomplish this by inferring a smaller set of pseudo-inputs in a new feature space learned by a deep neural network. Our framework can equivalently be seen as jointly learning a nonlinear distance metric (induced by the deep feature space) an...
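A heavily simplified sketch of the idea described above: jointly optimizing a small embedding network and a set of learned pseudo-inputs under a soft nearest-neighbor objective. Every architectural and hyperparameter choice below is an assumption for illustration, not the DSNC algorithm itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy setup: compress n labelled points into m << n learned pseudo-inputs
# that live in a feature space produced by a small network.
torch.manual_seed(0)
n, m, d, k, num_classes = 500, 20, 10, 16, 3
X = torch.randn(n, d)
y = torch.randint(0, num_classes, (n,))

embed = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, k))
idx = torch.randperm(n)[:m]
pseudo_x = nn.Parameter(X[idx].clone())   # pseudo-inputs, optimized jointly
pseudo_y = y[idx].clone()                 # their (fixed) labels
onehot = F.one_hot(pseudo_y, num_classes).float()

opt = torch.optim.Adam(list(embed.parameters()) + [pseudo_x], lr=1e-2)

for step in range(200):
    z = embed(X)              # training data in the learned feature space
    c = embed(pseudo_x)       # pseudo-inputs in the same space
    d2 = torch.cdist(z, c) ** 2
    # Soft nearest-neighbor class probabilities over the compressed set
    probs = torch.softmax(-d2, dim=1) @ onehot
    loss = F.nll_loss(torch.log(probs + 1e-12), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# At test time, classify a query by its nearest pseudo-input (1-NN)
# in the learned embedding space.
```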

Journal

Journal title: Mathematical Problems in Engineering

Year: 2015

ISSN: 1024-123X, 1563-5147

DOI: 10.1155/2015/352849